Lower tails via relative entropy

Authors

Abstract

We show that the naive mean-field approximation correctly predicts the leading term of the logarithmic lower tail probabilities for the number of copies of a given subgraph in G(n,p) and of arithmetic progressions of a given length in random subsets of the integers, in the entire range of densities where the approximation is viable. Our main technical result provides sufficient conditions on the maximum degrees of a uniform hypergraph H that guarantee that the logarithmic lower tail probabilities for the number of edges induced by a binomial random subset of the vertices of H can be well approximated by considering only product distributions. This may be interpreted as a weak, probabilistic version of the container lemma that is applicable to all sparser-than-average (and not only independent) sets.


Similar articles

Nonsubjective Priors via Predictive Relative Entropy

We explore the construction of nonsubjective prior distributions in Bayesian statistics via a posterior predictive relative entropy regret criterion. We carry out a minimax analysis based on a derived asymptotic predictive loss function and show that this approach to prior construction has a number of attractive features. The approach here differs from previous work that uses either prior or po...



Lower bounds on $q$-wise independence tails and applications to min-entropy condensers

We present novel and sharp lower bounds for higher load moments in the classical problem of mapping M balls into N bins by q-universal hashing, specialized to the case when M = N. As a corollary we prove a tight counterpart to the result about min-entropy condensers due to Dodis, Pietrzak and Wichs (CRYPTO'14), which has found important applications in key derivation. It states that condensin...


Relative entropy via non-sequential recursive pair substitutions

The entropy of an ergodic source is the limit of properly rescaled 1-block entropies of the sources obtained by applying successive non-sequential recursive pair substitutions [7], [2]. In this paper we prove that the cross entropy and the Kullback-Leibler divergence can be obtained in a similar way. AMS classification scheme numbers: 94A17


Heavy Tails, Importance Sampling and Cross–Entropy

We consider the problem of estimating P(Y1 + · · · + Yn > x) by importance sampling when the Yi are i.i.d. and heavy-tailed. The idea is to exploit the cross-entropy method as a tool for choosing good parameters in the importance sampling distribution; in doing so, we use the asymptotic description that, given P(Y1 + · · · + Yn > x), n − 1 of the Yi have distribution F and one has the conditional dist...
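The setup in the abstract above can be illustrated with a minimal importance-sampling sketch. This is not the paper's cross-entropy procedure: as an assumption for illustration, the summands are taken to be Pareto-distributed, and the proposal is simply a heavier-tailed Pareto whose index (`alpha_prop`) stands in for the parameter that the cross-entropy method would tune.

```python
import random

def pareto_sample(alpha, rng):
    # Inverse-CDF sampling: if U ~ Uniform(0,1), then U**(-1/alpha)
    # is Pareto(alpha) with support [1, infinity).
    return rng.random() ** (-1.0 / alpha)

def pareto_pdf(y, alpha):
    # Density of Pareto(alpha) on [1, infinity).
    return alpha * y ** (-alpha - 1) if y >= 1 else 0.0

def is_estimate(n, x, alpha, alpha_prop, trials, seed=0):
    """Importance-sampling estimate of P(Y1 + ... + Yn > x) for
    i.i.d. Pareto(alpha) summands: draw from the heavier-tailed
    Pareto(alpha_prop), alpha_prop < alpha, so that the rare event
    occurs often, then reweight by the likelihood ratio."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        ys = [pareto_sample(alpha_prop, rng) for _ in range(n)]
        if sum(ys) > x:
            w = 1.0
            for y in ys:
                w *= pareto_pdf(y, alpha) / pareto_pdf(y, alpha_prop)
            total += w
    return total / trials
```

For example, `is_estimate(2, 50.0, 2.0, 1.0, 20000)` approximates a tail probability on the order of 10^-4 that crude Monte Carlo with the same budget would barely observe; choosing the proposal well (here, how much heavier to make the tail) is exactly the parameter-selection problem the cross-entropy method addresses.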



Journal

Journal title: Annals of Probability

Year: 2023

ISSN: 0091-1798, 2168-894X

DOI: https://doi.org/10.1214/22-aop1610